AIbase
# Polish pre-training

## Roberta Polish Kgr10 (clarin-pl)
Large Language Model

A Polish RoBERTa model pre-trained on the KGR10 corpus. Training has reached roughly 5% of the target duration; incremental checkpoints will be released as it progresses.
## Bert Base Polish Uncased V1 (dkleczek)
Large Language Model · Other

A Polish version of the BERT language model, released in both cased and uncased variants, suitable for Polish natural language processing tasks.
## Herbert Large Cased (allegro)
Large Language Model · Other

HerBERT is a Polish pre-trained language model based on the BERT architecture, trained with dynamic whole word masking and a sentence structure objective.
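The whole word masking used by HerBERT selects entire words for masking, so that every WordPiece of a chosen word is replaced together rather than masking sub-word pieces independently. The sketch below illustrates only that selection rule, not HerBERT's actual training code; the function name, `mask_prob`, and the `[MASK]` token are illustrative choices following BERT conventions, and it assumes WordPiece tokens where a leading `##` marks a continuation piece.

```python
import random

def whole_word_mask(tokens, mask_prob=0.15, mask_token="[MASK]", seed=0):
    """Whole word masking over WordPiece tokens (illustrative sketch).

    If a word is selected for masking, every sub-word piece of that
    word is replaced with the mask token. Pieces starting with "##"
    continue the preceding word.
    """
    rng = random.Random(seed)
    # Group token indices into whole words.
    words = []
    for i, tok in enumerate(tokens):
        if tok.startswith("##") and words:
            words[-1].append(i)
        else:
            words.append([i])
    masked = list(tokens)
    for piece_indices in words:
        # Select whole words, not individual pieces.
        if rng.random() < mask_prob:
            for i in piece_indices:
                masked[i] = mask_token
    return masked

# Example: "stol ##ica" is one word, so both pieces mask together.
tokens = ["war", "##szawa", "jest", "stol", "##ica", "polski"]
print(whole_word_mask(tokens, mask_prob=0.5, seed=0))
# → ['war', '##szawa', 'jest', '[MASK]', '[MASK]', '[MASK]']
```

The "dynamic" variant simply re-samples the masked words on each pass over the data (e.g. a fresh seed per epoch), instead of fixing one masking at preprocessing time as original BERT did.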
© 2025 AIbase